Head-order Techniques and Other Pragmatics of Lambda Calculus Graph Reduction
Author

Abstract
In this dissertation, Lambda Calculus reduction is studied as a means of improving the support for declarative computing. We consider systems having reduction semantics; i.e., systems in which computations consist of equivalence-preserving transformations between expressions. The approach becomes possible by reducing expressions beyond weak normal form, allowing expression-level output values, and avoiding compilation-centered transformations. In particular, we develop reduction algorithms which, although not optimal, are highly efficient. A minimal linear notation for lambda expressions and for certain runtime structures is introduced for explaining operational features. This notation is related to recent theories which formalize the notion of substitution. Our main reduction technique is Berkling's Head Order Reduction (HOR), a delayed-substitution algorithm which emphasizes the extended left spine. HOR uses the de Bruijn representation for variables and a mechanism for artificially binding relatively free variables. HOR produces a lazy variant of the head normal form, the natural midway point of reduction. It is shown that beta reduction in the scope of relatively free variables is not hard. Full normalization suggests new applications by not relegating partial evaluation to a meta level. Variations of HOR are presented, including a conservative "breadth-first" one which takes advantage of the inherent parallelism of the head normal form. A reduction system must be capable of sharing intermediate results, yet sharing under HOR has not received attention to date. In this dissertation, variations of HOR which achieve sharing are described. Sharing is made possible via the special treatment of expressions referred to by head variables. The reduction strategy is based on normal order and achieves low reduction counts, but is shown to be incomplete. Head Order Reduction with and without sharing, as well as other competing algorithms, are evaluated on several test sets.
Our results indicate that reduction rates in excess of one million reductions per second can be achieved on current processors in interpretive mode and with minimal pre- and post-processing. By extending the efficient algorithms for the pure calculus presented in this dissertation with primitives and data structures, it is now possible to build useful reduction systems. We present some suggestions on how such systems can be designed.
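The notions the abstract relies on (de Bruijn indices, head reduction, head normal form as the "midway point") can be illustrated with a minimal sketch. This is not Berkling's HOR itself: HOR delays substitutions along the extended left spine, whereas the sketch below uses the textbook eager shift/substitute formulation; all names are illustrative, and the functions diverge on terms with no head normal form, just as head reduction does.

```python
from dataclasses import dataclass
from typing import Union

@dataclass(frozen=True)
class Var:
    n: int          # de Bruijn index; 0 refers to the innermost binder

@dataclass(frozen=True)
class Lam:
    body: "Term"    # abstraction; the bound variable is implicit

@dataclass(frozen=True)
class App:
    f: "Term"
    a: "Term"

Term = Union[Var, Lam, App]

def shift(t: Term, d: int, cutoff: int = 0) -> Term:
    """Add d to every index in t that is free, i.e. >= cutoff."""
    if isinstance(t, Var):
        return Var(t.n + d) if t.n >= cutoff else t
    if isinstance(t, Lam):
        return Lam(shift(t.body, d, cutoff + 1))
    return App(shift(t.f, d, cutoff), shift(t.a, d, cutoff))

def subst(t: Term, s: Term, j: int = 0) -> Term:
    """t with index j replaced by s; indices above j are decremented
    because the binder being eliminated disappears."""
    if isinstance(t, Var):
        if t.n == j:
            return shift(s, j)           # s crossed j binders on the way in
        return Var(t.n - 1) if t.n > j else t
    if isinstance(t, Lam):
        return Lam(subst(t.body, s, j + 1))
    return App(subst(t.f, s, j), subst(t.a, s, j))

def whnf(t: Term) -> Term:
    """Contract head redexes only; never reduce under a lambda."""
    while isinstance(t, App):
        f = whnf(t.f)
        if isinstance(f, Lam):
            t = subst(f.body, t.a)       # beta step at the head
        else:
            return App(f, t.a)           # head variable: arguments untouched
    return t

def hnf(t: Term) -> Term:
    """Head normal form: like whnf, but continue under leading lambdas."""
    t = whnf(t)
    return Lam(hnf(t.body)) if isinstance(t, Lam) else t
```

For example, with K written as `Lam(Lam(Var(1)))`, `hnf(App(App(K, Var(3)), Var(4)))` yields `Var(3)`, while arguments sitting under a free head variable are left unreduced — which is what makes head normal form the natural midway point between weak normal form and full normalization.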
Similar Resources
A Fine-Grained Notation for Lambda
We discuss issues relevant to the practical use of a previously proposed notation for lambda terms in contexts where the intensions of such terms have to be manipulated. This notation uses the 'nameless' scheme of de Bruijn, includes expressions for encoding terms together with substitutions to be performed on them and contains a mechanism for combining such substitutions so that they can be effec...
On the Invariance of the Unitary Cost Model for Head Reduction
The λ-calculus is a widely accepted computational model of higher-order functional programs, yet there is no direct and universally accepted cost model for it. As a consequence, the computational difficulty of reducing λ-terms to their normal form is typically studied by reasoning about concrete implementation algorithms. In this paper, we show that when head reduction is the underlying dynam...
Head reduction and normalization in a call-by-value lambda-calculus
Recently, a standardization theorem has been proven for a variant of Plotkin’s call-by-value lambda-calculus extended by means of two commutation rules (sigma-reductions): this result was based on a partitioning between head and internal reductions. We study the head normalization for this call-by-value calculus with sigma-reductions and we relate it to the weak evaluation of original Plotkin’s...
The Montagovian generative lexicon ΛTyn: an integrated type-theoretical framework for compositional semantics and lexical pragmatics
In recent years, different but similar approaches have integrated lexical semantics within compositional semantics. These approaches all make use of type theory, at least to compose meanings. We present here in some detail one of these approaches, ours, which makes use of second-order lambda calculus as a type theory for meaning assembly and of multi-sorted higher-order predicate logic for...
On the Relation between the λμ-Calculus and the Syntactic Theory of Sequential Control
We construct a translation of first order λμ-calculus [15] into a subtheory of Felleisen’s λc-calculus [5, 6]. This translation preserves typing and reduction. Then, by constructing the inverse translation, we show that the two calculi are actually isomorphic.
Publication date: 2002